Minimax Estimation of Kernel Mean Embeddings

Authors

  • Ilya O. Tolstikhin
  • Bharath K. Sriperumbudur
  • Krikamol Muandet
Abstract

In this paper, we study the minimax estimation of the Bochner integral μ_k(P) := ∫ k(·, x) dP(x), also called the kernel mean embedding, based on random samples drawn i.i.d. from P, where k is a positive definite kernel. ...
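
As a rough illustration of the quantity being estimated, the sketch below evaluates the standard plug-in estimator μ̂_P = (1/n) Σ_i k(·, x_i) of the kernel mean embedding at a few query points. This is a minimal, hypothetical example: the Gaussian kernel, the bandwidth, and the function names are assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def empirical_mean_embedding(sample, query, sigma=1.0):
    """Plug-in estimator mu_hat(.) = (1/n) * sum_i k(., x_i), evaluated at the query points."""
    return gaussian_kernel(query, sample, sigma).mean(axis=1)

# toy usage: embed an i.i.d. sample from P = N(0, 1) and evaluate the embedding on a grid
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1))
grid = np.linspace(-3.0, 3.0, 7)[:, None]
print(empirical_mean_embedding(X, grid, sigma=1.0))
```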


Related articles

MONK - Outlier-Robust Mean Embedding Estimation by Median-of-Means

Mean embeddings provide an extremely flexible and powerful tool in machine learning and statistics to represent probability distributions and define a semi-metric (MMD, maximum mean discrepancy; also called N-distance or energy distance), with numerous successful applications. The representation is constructed as the expectation of the feature map defined by a kernel. As a mean, its classical e...
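
The sketch below is a deliberately simplified, hypothetical illustration of the median-of-means idea for mean embedding estimation: split the sample into blocks, compute each block's empirical embedding at some query points, and take a coordinate-wise median of the block estimates. The actual MONK estimators work with medians in the RKHS and differ in detail; the kernel, block count, and names here are assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def mom_mean_embedding(sample, query, n_blocks=10, sigma=1.0):
    """Coordinate-wise median of block-wise empirical mean embeddings at the query points."""
    blocks = np.array_split(sample, n_blocks)
    block_embeddings = np.stack(
        [gaussian_kernel(query, b, sigma).mean(axis=1) for b in blocks]
    )
    return np.median(block_embeddings, axis=0)

# toy usage: a few gross outliers barely move the median-of-means estimate
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(490, 1)), 50.0 + rng.normal(size=(10, 1))])
grid = np.linspace(-3.0, 3.0, 7)[:, None]
print(mom_mean_embedding(X, grid, n_blocks=10))
```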


Minimax Estimation of Maximum Mean Discrepancy with Radial Kernels

Maximum Mean Discrepancy (MMD) is a distance on the space of probability measures which has found numerous applications in machine learning and nonparametric testing. This distance is based on the notion of embedding probabilities in a reproducing kernel Hilbert space. In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold...
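
For concreteness, here is a small, hypothetical sketch of the standard U-statistic (unbiased) estimator of MMD² between two finite samples, using a Gaussian (radial) kernel; the bandwidth and function names are assumptions, and the lower bounds in the paper concern how well quantities of this kind can be estimated from such samples.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    """U-statistic estimate of MMD^2 between samples X ~ P and Y ~ Q."""
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    # drop diagonal terms so the within-sample averages are unbiased
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()

# toy usage: the estimate is positive (up to noise) when P differs from Q
rng = np.random.default_rng(2)
X = rng.normal(0.0, 1.0, size=(300, 1))
Y = rng.normal(0.5, 1.0, size=(300, 1))
print(mmd2_unbiased(X, Y, sigma=1.0))
```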


Kernel density estimation on the rotation group and its application to crystallographic texture analysis

We are concerned with kernel density estimation on the rotation group SO(3). We prove asymptotically optimal convergence rates for the minimax risk of the mean integrated squared error for different function classes, including bandlimited functions, functions with bounded Sobolev norm, and functions with polynomially decaying Fourier coefficients, and give optimal kernel functions. Furthermore, we c...
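
As a rough sketch only, a kernel density estimate on the rotation group can be built from the geodesic rotation angle between the query rotation and each sample rotation, as below. The kernel form, bandwidth, and missing normalisation constant are assumptions for illustration; the paper's optimal kernels are characterised through their Fourier coefficients on SO(3), which this sketch does not attempt.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def rotation_angle(R1, R2):
    """Geodesic distance on SO(3): the angle of the relative rotation R1^T R2, in [0, pi]."""
    cos = (np.trace(R1.T @ R2) - 1.0) / 2.0
    return np.arccos(np.clip(cos, -1.0, 1.0))

def kde_so3(samples, query, bandwidth=0.3):
    """Unnormalised KDE on SO(3): average a Gaussian-type kernel in the rotation angle
    between the query rotation and each sample rotation."""
    angles = np.array([rotation_angle(R, query) for R in samples])
    return np.mean(np.exp(-(angles / bandwidth) ** 2))

# toy usage: sample rotations concentrated near the identity; the estimate is large
# at the identity and small at a rotation by pi about the z-axis
rng = np.random.default_rng(3)
samples = Rotation.from_rotvec(0.3 * rng.normal(size=(200, 3))).as_matrix()
far = Rotation.from_rotvec([0.0, 0.0, np.pi]).as_matrix()
print(kde_so3(samples, np.eye(3)), kde_so3(samples, far))
```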


Conditional mean embeddings as regressors

We demonstrate an equivalence between reproducing kernel Hilbert space (RKHS) embeddings of conditional distributions and vector-valued regressors. This connection introduces a natural regularized loss function which the RKHS embeddings minimise, providing an intuitive understanding of the embeddings and a justification for their use. Furthermore, the equivalence allows the application of vecto...
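
To make the regression view concrete, the hypothetical sketch below computes the standard regularised conditional mean embedding estimate, μ̂_{Y|X=x} = Σ_i β_i(x) k_Y(·, y_i) with weights β(x) = (K_X + nλI)⁻¹ k_X(x); pushing the same weights through the identity feature map on Y reduces to kernel ridge regression, which illustrates the equivalence the abstract describes. The kernel choice, λ, σ, and names are assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def cme_weights(X, x_query, lam=1e-2, sigma=0.5):
    """Weights beta(x) = (K_X + n*lam*I)^{-1} k_X(x); the conditional mean embedding
    of Y given X = x is then sum_i beta_i(x) k_Y(., y_i)."""
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    k_x = gaussian_kernel(X, x_query, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), k_x)

# toy usage: estimate E[Y | X = 1] for Y = sin(X) + noise by applying the weights
# directly to the observed Y values (i.e. plain kernel ridge regression)
rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, size=(200, 1))
Y = np.sin(X) + 0.1 * rng.normal(size=(200, 1))
beta = cme_weights(X, np.array([[1.0]]))
print((beta.T @ Y).item(), np.sin(1.0))  # the two values should be close
```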


Variance estimation in nonparametric regression via the difference sequence method (short title: Sequence-based variance estimation)

Consider a Gaussian nonparametric regression problem having both an unknown mean function and unknown variance function. This article presents a class of difference-based kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwidths are fully characterized, and asymptotic normality is also established. We also show that for ...
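
The paper's class of difference-based kernel estimators generalises the classical low-order difference estimators. As a hypothetical sketch of the basic idea only (differencing adjacent responses removes the smooth mean locally, leaving suitably scaled noise), here are the first- and second-order versions on an equally spaced design; this is not the paper's kernel-weighted class.

```python
import numpy as np

def diff_variance(y, order=1):
    """Difference-based variance estimators for y_i = f(x_i) + eps_i on a sorted design.
    order=1: Rice estimator        sum (y_{i+1} - y_i)^2 / (2(n-1))
    order=2: Gasser-type estimator sum (y_{i+1} - 2 y_i + y_{i-1})^2 / (6(n-2))"""
    y = np.asarray(y, dtype=float)
    n = len(y)
    if order == 1:
        return np.sum(np.diff(y) ** 2) / (2 * (n - 1))
    if order == 2:
        d2 = y[2:] - 2 * y[1:-1] + y[:-2]
        return np.sum(d2 ** 2) / (6 * (n - 2))
    raise ValueError("order must be 1 or 2")

# toy usage: unknown smooth mean, true noise variance 0.2**2 = 0.04
rng = np.random.default_rng(5)
x = np.linspace(0.0, 1.0, 500)
y = np.sin(4 * np.pi * x) + 0.2 * rng.normal(size=500)
print(diff_variance(y, order=1), diff_variance(y, order=2), 0.04)
```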



Journal:
  • Journal of Machine Learning Research

Volume 18, Issue

Pages –

Publication date: 2017